143 research outputs found

    Space--Time Tradeoffs for Subset Sum: An Improved Worst Case Algorithm

    The technique of Schroeppel and Shamir (SICOMP, 1981) has long been the most efficient way to trade space against time for the SUBSET SUM problem. In the random-instance setting, however, improved tradeoffs exist. In particular, the recently discovered dissection method of Dinur et al. (CRYPTO 2012) yields a significantly improved space-time tradeoff curve for instances with strong randomness properties. Our main result is that these strong randomness assumptions can be removed, obtaining the same space-time tradeoffs in the worst case. We also show that for small space usage the dissection algorithm can be almost fully parallelized. Our strategy for dealing with arbitrary instances is instead to inject the randomness into the dissection process itself by working over a carefully selected but random composite modulus, and to introduce explicit space-time controls into the algorithm by means of a "bailout mechanism".
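
    As a point of reference for this tradeoff curve, the sketch below shows the classic meet-in-the-middle baseline for SUBSET SUM (roughly 2^(n/2) time and space), which the Schroeppel-Shamir and dissection techniques improve upon by trading time for space; the function name and toy instance are illustrative only and not taken from the paper.

```python
from itertools import combinations

def subset_sum_mitm(weights, target):
    """Classic meet-in-the-middle for SUBSET SUM: split the items into two
    halves, enumerate all subset sums of the left half into a dictionary,
    then look up the complement of every right-half sum."""
    n = len(weights)
    left, right = weights[:n // 2], weights[n // 2:]

    # All subset sums of the left half, remembering one subset per sum.
    left_sums = {}
    for r in range(len(left) + 1):
        for combo in combinations(range(len(left)), r):
            s = sum(left[i] for i in combo)
            left_sums.setdefault(s, combo)

    # For every right-half subset, check whether the left half can supply
    # the missing amount.
    for r in range(len(right) + 1):
        for combo in combinations(range(len(right)), r):
            s = sum(right[i] for i in combo)
            if target - s in left_sums:
                left_part = [left[i] for i in left_sums[target - s]]
                right_part = [right[i] for i in combo]
                return left_part + right_part
    return None

if __name__ == "__main__":
    # Toy instance, purely for illustration.
    print(subset_sum_mitm([3, 34, 4, 12, 5, 2], 9))  # e.g. [4, 5]
```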

    When Constant-time Source Yields Variable-time Binary: Exploiting Curve25519-donna Built with MSVC 2015

    The elliptic curve Curve25519 has been presented as protected against state-of-the-art timing attacks [2]. This paper shows that a timing attack is still achievable against a particular X25519 implementation which follows the RFC 7748 requirements [11]. The attack allows the retrieval of the complete private key used in the ECDH protocol. This is achieved due to timing leakage during Montgomery ladder execution and relies on a conditional branch in the Windows runtime library 2015. The attack can be applied remotely.
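
    The leak described here stems from secret-dependent control flow inside the Montgomery ladder. The following sketch (plain Python, which offers no real timing guarantees and only illustrates the pattern) contrasts a swap that branches on a secret bit with the branch-free arithmetic conditional swap that constant-time X25519 ladders are expected to use; all names are hypothetical and not taken from the paper.

```python
def cswap_branching(bit, a, b):
    """Swap that branches on a secret bit -- the kind of data-dependent
    control flow a compiler or runtime library can turn into a timing leak."""
    if bit:                      # secret-dependent branch
        return b, a
    return a, b

def cswap_arithmetic(bit, a, b):
    """Branch-free conditional swap used in constant-time ladders:
    mask is all-ones when bit == 1 and all-zeros when bit == 0."""
    mask = -bit & ((1 << 256) - 1)     # 0x00...0 or 0xff...f
    t = mask & (a ^ b)
    return a ^ t, b ^ t

if __name__ == "__main__":
    a, b = 0x1234, 0x5678
    assert cswap_branching(1, a, b) == cswap_arithmetic(1, a, b) == (b, a)
    assert cswap_branching(0, a, b) == cswap_arithmetic(0, a, b) == (a, b)
```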

    Ternary Syndrome Decoding with Large Weight

    The Syndrome Decoding problem is at the core of many code-based cryptosystems. In this paper, we study ternary Syndrome Decoding with large weight. This problem was introduced in the Wave signature scheme but has never been thoroughly studied. We perform an algorithmic study of this problem which results in an update of the Wave parameters. On a more fundamental level, we show that ternary Syndrome Decoding with large weight is a substantially harder problem than binary Syndrome Decoding, which could have several applications for the design of code-based cryptosystems.
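
    For concreteness, here is a tiny brute-force sketch of the problem being studied: given a parity-check matrix H over F_3 and a syndrome s, find an error vector e of prescribed (large) weight w with H·e ≡ s (mod 3). The toy instance and function name are illustrative; real parameters are far beyond brute force.

```python
from itertools import combinations, product

def ternary_syndrome_decode(H, s, w):
    """Brute-force search for e in F_3^n with exactly w nonzero entries
    (values in {1, 2}) such that H . e == s (mod 3).  Exponential -- only
    meant to pin down the problem statement."""
    rows, n = len(H), len(H[0])
    for support in combinations(range(n), w):
        for values in product((1, 2), repeat=w):
            e = [0] * n
            for idx, val in zip(support, values):
                e[idx] = val
            if all(sum(H[i][j] * e[j] for j in range(n)) % 3 == s[i]
                   for i in range(rows)):
                return e
    return None

if __name__ == "__main__":
    # Toy instance over F_3: a 2 x 4 parity-check matrix, weight-3 error.
    H = [[1, 0, 2, 1],
         [0, 1, 1, 2]]
    s = [2, 0]
    print(ternary_syndrome_decode(H, s, 3))  # e.g. [1, 1, 2, 0]
```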

    Computational environment for modeling and analysing network traffic behaviour using the divide and recombine framework

    There are two essential goals of this research. The first goal is to design and construct a computational environment for studying large and complex datasets in the cybersecurity domain. The second goal is to analyse the Spamhaus blacklist query dataset, which includes uncovering the properties of blacklisted hosts and understanding how their behaviour changes over time. The analytical environment enables deep analysis of very large and complex datasets by exploiting the divide and recombine framework. The capability to analyse data in depth allows the research to go beyond summary statistics. This deep analysis operates at the highest level of granularity without any compromise on the size of the data. The environment is also fully capable of processing the raw data into a data structure suited for analysis. Spamhaus is an organisation that identifies malicious hosts on the Internet. Information about malicious hosts is stored in a distributed database by Spamhaus and served through DNS query-responses. Spamhaus and other malicious-host-blacklisting organisations have replaced the smaller malicious-host databases that multiple organisations once curated independently for their internal needs. Spamhaus services are popular due to their free access, exhaustive information, historical information, simple DNS-based implementation, and reliability. The malicious-host information obtained from these databases is used as the first step in weeding out potentially harmful hosts on the Internet. During the course of this research, a detailed packet-level analysis was carried out on the Spamhaus blacklist data. It was observed that the query-responses displayed some peculiar behaviours. These anomalies were studied and modelled, and found to exhibit definite patterns. These patterns are empirical evidence of a systemic or statistical phenomenon.
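
    The divide and recombine pattern itself is simple to state: partition the data into subsets, apply the same analytic method to each subset independently (often in parallel), then recombine the per-subset results. The sketch below is a generic, self-contained Python illustration of that pattern on made-up blacklist-query records; it is not the thesis's actual environment, whose back end handles far larger data.

```python
from collections import Counter
from multiprocessing import Pool

# Made-up records: (queried_host, was_blacklisted) pairs.
RECORDS = [("198.51.100.7", True), ("203.0.113.9", False),
           ("198.51.100.7", True), ("192.0.2.1", True),
           ("203.0.113.9", False), ("198.51.100.7", False)]

def divide(records, n_subsets):
    """Divide step: partition the records into n_subsets chunks."""
    return [records[i::n_subsets] for i in range(n_subsets)]

def analyse_subset(subset):
    """Per-subset analytic method: count blacklisted hits per host."""
    return Counter(host for host, hit in subset if hit)

def recombine(partials):
    """Recombine step: merge the per-subset counters."""
    total = Counter()
    for partial in partials:
        total += partial
    return total

if __name__ == "__main__":
    subsets = divide(RECORDS, 3)
    with Pool(3) as pool:              # subsets analysed in parallel
        partials = pool.map(analyse_subset, subsets)
    print(recombine(partials))         # e.g. Counter({'198.51.100.7': 2, ...})
```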

    New Attacks on RSA with Small Secret CRT-Exponents


    Square root algorithms for the number field sieve

    We review several methods for the square root step of the Number Field Sieve, and present an original one based on the Chinese Remainder Theorem.
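
    As a toy illustration of the CRT-based idea (in plain Python integers, far from the algebraic-number setting of the actual NFS square root step): take a perfect square, compute its square root modulo several primes p ≡ 3 (mod 4), and recombine the residues with the Chinese Remainder Theorem, resolving the per-prime sign ambiguity by checking which recombination squares back to the input. All values below are illustrative.

```python
from itertools import product
from math import prod

def crt(residues, moduli):
    """Chinese Remainder Theorem: the unique x mod prod(moduli) with
    x = residues[i] (mod moduli[i]) for all i (moduli pairwise coprime)."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)
    return x % M

def sqrt_via_crt(square, primes):
    """Toy CRT square root: for primes p = 3 (mod 4), a square root of a
    mod p is a^((p+1)/4); try both signs per prime and keep the
    recombination whose square equals the input."""
    roots = [pow(square % p, (p + 1) // 4, p) for p in primes]
    for signs in product((1, -1), repeat=len(primes)):
        candidate = crt([(s * r) % p for s, r, p in zip(signs, roots, primes)],
                        primes)
        if candidate * candidate == square:
            return candidate
    return None

if __name__ == "__main__":
    x = 123456789
    primes = [10007, 10039, 10103, 10111]   # each congruent to 3 mod 4
    assert prod(primes) > x                 # combined modulus must exceed the root
    print(sqrt_via_crt(x * x, primes))      # -> 123456789
```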

    Rounding and Chaining LLL: Finding Faster Small Roots of Univariate Polynomial Congruences

    In a seminal work at EUROCRYPT '96, Coppersmith showed how to find all small roots of a univariate polynomial congruence in polynomial time: this has found many applications in public-key cryptanalysis and in a few security proofs. However, the running time of the algorithm is a high-degree polynomial, which limits experiments: the bottleneck is an LLL reduction of a high-dimensional matrix with extra-large coefficients. We present in this paper the first significant speedups over Coppersmith's algorithm. The first speedup is based on a special property of the matrices used by Coppersmith's algorithm, which allows us to provably speed up the LLL reduction by rounding, and which can also be used to improve the complexity analysis of Coppersmith's original algorithm. The exact speedup depends on the LLL algorithm used: for instance, the speedup is asymptotically quadratic in the bit-size of the small-root bound if one uses the Nguyen-Stehlé L2 algorithm. The second speedup is heuristic and applies whenever one wants to enlarge the root size of Coppersmith's algorithm by exhaustive search. Instead of performing several LLL reductions independently, we exploit hidden relationships between these matrices so that the LLL reductions can be somewhat chained to decrease the global running time. When both speedups are combined, the new algorithm is in practice hundreds of times faster for typical parameters.
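
    The sketch below, a heavily simplified illustration rather than the paper's algorithm, builds the standard Coppersmith/Howgrave-Graham lattice basis for a monic polynomial f modulo N and then applies a naive version of the rounding idea (dropping a common number of low-order bits from every entry so the subsequent LLL call works on smaller numbers). The paper's rounding step is applied more carefully and comes with a provable analysis; the LLL reduction itself (e.g., via fpylll) is not run here, and all names and parameters are made up.

```python
def poly_mul(p, q):
    """Multiply polynomials given as coefficient lists (lowest degree first)."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_pow(p, e):
    out = [1]
    for _ in range(e):
        out = poly_mul(out, p)
    return out

def coppersmith_basis(f, N, X, m):
    """Rows are the coefficient vectors of g_{i,j}(xX) = N^(m-i) * f(xX)^i * (xX)^j
    for 0 <= i <= m, 0 <= j < deg(f).  Every g_{i,j} vanishes mod N^m at any
    root of f mod N, which is what Coppersmith's method relies on."""
    d = len(f) - 1
    dim = d * (m + 1)
    rows = []
    for i in range(m + 1):
        fi = poly_pow(f, i)
        for j in range(d):
            g = [0] * j + [c * N ** (m - i) for c in fi]   # x^j * N^(m-i) * f^i
            g = [c * X ** k for k, c in enumerate(g)]      # substitute x -> xX
            rows.append(g + [0] * (dim - len(g)))
    return rows

def round_basis(rows, keep_bits=64):
    """The rounding idea in a nutshell: drop a common number of low-order
    bits from every entry before reducing, so LLL sees much smaller numbers."""
    scale = max(abs(c) for row in rows for c in row).bit_length() - keep_bits
    scale = max(scale, 0)
    return [[c >> scale for c in row] for row in rows]

if __name__ == "__main__":
    N = 0xC3A6B7_1234_5677              # toy modulus, illustration only
    f = [N - 3, 1, 1]                   # f(x) = x^2 + x + (N - 3), monic
    basis = coppersmith_basis(f, N, X=2**20, m=3)
    rounded = round_basis(basis)
    big = max(abs(c) for row in basis for c in row).bit_length()
    small = max(abs(c) for row in rounded for c in row).bit_length()
    print(f"{len(basis)}x{len(basis[0])} basis, entries {big} -> {small} bits; next step: LLL")
```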

    Twisting Lattice and Graph Techniques to Compress Transactional Ledgers

    Keeping track of financial transactions (e.g., in banks and blockchains) means keeping track of an ever-increasing list of exchanges between accounts. In fact, many of these transactions can be safely "forgotten", in the sense that purging a set of them that compensate each other does not impact the network's semantic meaning (e.g., the accounts' balances). We call nilcatenation a collection of transactions having no effect on a network's semantics. Such exchanges may be archived and removed, yielding a smaller but equivalent ledger. Motivated by the computational and analytic benefits obtained from more compact representations of numerical data, we formalize the problem of finding nilcatenations, and propose detection methods based on graph and lattice-reduction techniques. Beyond interesting applications of this work (e.g., decoupling of centralized and distributed databases), we also discuss the original idea of a "community-serving proof of work": finding nilcatenations constitutes a proof of useful work, as the periodic removal of nilcatenations reduces the transactional graph's size.
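
    To make the central object concrete: a nilcatenation is a non-empty subset of transactions whose removal leaves every account balance unchanged, i.e., whose net flow at every account is zero. The brute-force sketch below, with made-up names and data and exponential search rather than the paper's graph and lattice-reduction methods, simply checks subsets of a toy ledger for that property.

```python
from collections import defaultdict
from itertools import combinations

# Toy ledger: (sender, receiver, amount).
LEDGER = [("alice", "bob", 5), ("bob", "carol", 5), ("carol", "alice", 5),
          ("alice", "dave", 7), ("dave", "bob", 2)]

def net_flows(transactions):
    """Net balance change per account induced by a set of transactions."""
    flow = defaultdict(int)
    for sender, receiver, amount in transactions:
        flow[sender] -= amount
        flow[receiver] += amount
    return flow

def find_nilcatenation(ledger):
    """Return a non-empty subset of transactions whose net effect on every
    account is zero (a 'nilcatenation'), or None.  Brute force over subsets,
    so only usable on tiny examples."""
    for r in range(1, len(ledger) + 1):
        for subset in combinations(ledger, r):
            if all(v == 0 for v in net_flows(subset).values()):
                return list(subset)
    return None

if __name__ == "__main__":
    # The 5-coin cycle alice -> bob -> carol -> alice can be purged safely.
    print(find_nilcatenation(LEDGER))
```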

    Cryptanalysis of an NTRU-based Proxy Encryption Scheme from ASIACCS'15

    In ASIACCS 2015, Nuñez, Agudo, and Lopez proposed a proxy re-encryption scheme, NTRUReEncrypt, based on NTRU, which allows a proxy to translate a ciphertext under the delegator's public key into a re-encrypted ciphertext that can be decrypted correctly with the delegatee's private key. In addition to its potential resistance to quantum algorithms, the scheme was also considered to be efficient. However, in this paper we point out that the re-encryption process increases the decryption error, and that this increased decryption error leads to a reaction attack enabling the proxy to recover the private keys of the delegator and the delegatee. Moreover, we also propose a second attack which enables the delegatee to recover the private key of the delegator when he collects enough re-encrypted ciphertexts of the same message. We re-evaluate the security of NTRUReEncrypt, and also give suggestions and discussions on potential mitigation methods.
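
    The attacks hinge on decryption failures: decryption in NTRU-like schemes succeeds only while an accumulated noise term stays within a fixed range, and each re-encryption adds a further noise contribution, so failures become noticeably more likely and their pattern depends on the secret keys. The Monte Carlo toy below, with entirely made-up distributions and threshold rather than the scheme's actual parameters, illustrates only the first point: the growth of the failure rate after re-encryption.

```python
import random

def noise_coefficient(n_terms, spread=16):
    """Toy model of one coefficient of the decryption noise polynomial:
    a sum of n_terms small random contributions (made-up distribution)."""
    return sum(random.randint(-spread, spread) for _ in range(n_terms))

def failure_rate(n_terms, threshold=80, coeffs=16, trials=10000):
    """Fraction of trials in which some coefficient leaves [-threshold, threshold],
    i.e. a decryption failure in this toy model."""
    fails = 0
    for _ in range(trials):
        if any(abs(noise_coefficient(n_terms)) > threshold for _ in range(coeffs)):
            fails += 1
    return fails / trials

if __name__ == "__main__":
    random.seed(0)
    # A fresh ciphertext carries fewer noise terms than a re-encrypted one.
    print("fresh ciphertexts       :", failure_rate(n_terms=6))
    print("re-encrypted ciphertexts:", failure_rate(n_terms=10))
```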